Description of the minimizers of least squares regularized with l0-norm. Uniqueness of the global minimizer

Author

  • Mila Nikolova
Abstract

We have an M × N real-valued arbitrary matrix A (e.g. a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function Fd combining a quadratic data-fidelity term and an l0 penalty applied to each entry of the sought-after solution, weighted by a regularization parameter β > 0. For several decades, this objective has attracted a ceaseless effort to design algorithms that approach a good minimizer. Our theoretical contributions, summarized below, shed new light on the existing algorithms and can help in the design of innovative numerical schemes. Solving the normal equation associated with any M-row submatrix of A is equivalent to computing a local minimizer û of Fd. (Local) minimizers û of Fd are strict if and only if the submatrix, composed of those columns of A whose indexes form the support of û, has full column rank. A consequence is that strict local minimizers of Fd are easily computed without knowing the value of β. Each strict local minimizer is linear in the data. It is proved that Fd has global minimizers and that they are always strict. They are studied in more detail under the (standard) assumption that rank(A) = M < N. The global minimizers with M-length support are seen to be impractical. Given d, critical values βK for any K ≤ M − 1 are exhibited such that if β > βK, all global minimizers of Fd are K-sparse. An assumption on A is adopted and proved to fail only on a closed negligible subset. Then for all data d beyond a closed negligible subset, the objective Fd for β > βK, K ≤ M − 1, has a unique global minimizer and this minimizer is K-sparse. Instructive small-size (5 × 10) numerical illustrations confirm the main theoretical results.
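
To make the statement about strict local minimizers concrete, here is a minimal NumPy sketch (an illustration written for this summary, not the paper's code): on a small random 5 × 10 instance it enumerates candidate supports, solves the least-squares problem restricted to the corresponding columns of A (i.e. the associated normal equation), checks whether that column submatrix has full column rank (the strictness criterion above), and compares objective values. The matrix A, the data d, and the value of β are arbitrary illustrative choices.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    M, N, beta = 5, 10, 0.5            # illustrative sizes and regularization weight
    A = rng.standard_normal((M, N))    # arbitrary real-valued matrix with M < N
    d = rng.standard_normal(M)         # data vector

    def F(u):
        # F_d(u) = ||A u - d||^2 + beta * ||u||_0
        return np.sum((A @ u - d) ** 2) + beta * np.count_nonzero(u)

    def local_minimizer(support):
        # Least-squares solve restricted to the columns of A indexed by `support`;
        # the candidate is strict exactly when A[:, support] has full column rank.
        A_s = A[:, list(support)]
        coeffs, _, rank, _ = np.linalg.lstsq(A_s, d, rcond=None)
        u = np.zeros(N)
        u[list(support)] = coeffs
        return u, rank == len(support)

    # Enumerate all supports of size at most M - 1 and keep the best strict candidate.
    # Note that beta is not needed to compute each candidate, only to compare them.
    best_u, best_val = np.zeros(N), F(np.zeros(N))
    for k in range(1, M):
        for support in itertools.combinations(range(N), k):
            u, strict = local_minimizer(support)
            if strict and F(u) < best_val:
                best_u, best_val = u, F(u)

    print("best sparse candidate:", np.round(best_u, 3))
    print("objective value:", round(best_val, 3))

This brute-force enumeration is only feasible for tiny problems such as the 5 × 10 illustration mentioned in the abstract; its sole purpose is to show that each strict local minimizer is a plain least-squares solve on a column submatrix and therefore depends linearly on d.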


Similar Articles

Description of the minimizers of least squares regularized with l0-norm. Uniqueness of the global minimizer (Apr 2013)

We have an M × N real-valued arbitrary matrix A (e.g. a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function Fd combining a quadratic data-fidelity term and an l0 penalty applied to each entry of the sought-after solution, weighted by a regularization paramete...

Description of the Minimizers of Least Squares Regularized with ℓ0-norm. Uniqueness of the Global Minimizer

We have an M × N real-valued arbitrary matrix A (e.g. a dictionary) with M < N and data d describing the sought-after object with the help of A. This work provides an in-depth analysis of the (local and global) minimizers of an objective function Fd combining a quadratic data-fidelity term and an l0 penalty applied to each entry of the sought-after solution, weighted by a regularization paramet...

Stability of Minimizers of Regularized Least Squares Objective Functions II: Study of the Global Behavior

We address estimation problems where the sought-after solution is defined as the minimizer of an objective function composed of a quadratic data-fidelity term and a regularization term. We especially focus on nonsmooth and/or nonconvex regularization terms because of their ability to yield good estimates. This work is dedicated to the stability of the minimizers of such nonsmooth and/or nonconv...

Relationship between the optimal solutions of least squares regularized with L0-norm and constrained by k-sparsity

Two widely used models to find a sparse solution from a noisy underdetermined linear system are the constrained problem where the quadratic error is minimized subject to a sparsity constraint, and the regularized problem where a regularization parameter balances the minimization of both quadratic error and sparsity. However, the connections between these two problems have remained unclear so fa...
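
For readability, the two models referred to in this abstract can be written in generic notation (symbols chosen here, not necessarily those of the cited paper) as follows:

    % k-sparsity-constrained formulation
    \min_{u \in \mathbb{R}^N} \ \|Au - d\|_2^2 \quad \text{subject to} \quad \|u\|_0 \le k
    % beta-regularized formulation
    \min_{u \in \mathbb{R}^N} \ \|Au - d\|_2^2 + \beta\,\|u\|_0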

Approximate L0 constrained non-negative matrix and tensor factorization

Non-negative matrix factorization (NMF), i.e. V ≈ WH where V, W, and H are all non-negative, has become a widely used blind source separation technique due to its part-based representation. The NMF decomposition is not in general unique, and a part-based representation is not guaranteed. However, imposing sparseness both improves the uniqueness of the decomposition and favors part-based representati...


Journal:

Volume   Issue

Pages   -

Publication date: 2014